A Modified Newton Method for Multilinear PageRank

Similar Articles

Multilinear PageRank

In this paper, we first extend the celebrated PageRank modification to a higher-order Markov chain. Although this system has attractive theoretical properties, it is computationally intractable for many interesting problems. We next study a computationally tractable approximation to the higher-order PageRank vector that involves a system of polynomial equations called multilinear PageRank. This...
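The multilinear PageRank system mentioned above can be illustrated with a simple fixed-point iteration on the equation x = α P (x ⊗ x) + (1 − α) v, where P is a column-stochastic flattening of a third-order transition tensor. This is a minimal sketch, not the paper's algorithm; the matrix P, vector v, and value of α below are made-up toy data:

```python
import numpy as np

def multilinear_pagerank(P, v, alpha=0.4, tol=1e-10, max_iter=1000):
    """Fixed-point iteration for x = alpha * P (x kron x) + (1 - alpha) * v.

    P is an n x n^2 column-stochastic flattening of a third-order
    transition tensor, v is a stochastic teleportation vector.
    For alpha < 1/2 the map is a contraction, so iteration converges."""
    x = v.copy()
    for _ in range(max_iter):
        x_new = alpha * P @ np.kron(x, x) + (1 - alpha) * v
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Tiny 2-state example: every column of P sums to 1 (hypothetical data).
P = np.array([[1.0, 0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5, 1.0]])
v = np.array([0.5, 0.5])
x = multilinear_pagerank(P, v, alpha=0.4)
```

Since P is column-stochastic and v is stochastic, each iterate remains a probability vector, so the fixed point x is itself a valid stationary distribution.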

A Modified Newton Method for Minimization I

Some promising ideas for minimizing a nonlinear function, whose first and second derivatives are given, by a modified Newton method, were introduced by Fiacco and McCormick (Ref. 1). Unfortunately, in developing a method around these ideas, Fiacco and McCormick used a potentially unstable, or even impossible, matrix factorization. Using some recently developed techniques for factorizing an inde...

A Modified Newton Method for Solving Non-linear Algebraic Equations

The Newton algorithm based on the "continuation" method may be written as being governed by the equation ẋ_j(t) + B_ij^{-1} F_i(x_j) = 0, where F_i(x_j) = 0, i, j = 1, ..., n, are the nonlinear algebraic equations (NAEs) to be solved, and B_ij = ∂F_i/∂x_j is the corresponding Jacobian matrix. It is known that Newton's algorithm is quadratically convergent; however, it has some drawbacks, such as...
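The continuation viewpoint above can be sketched as Euler integration of ẋ = −B(x)⁻¹ F(x); with step size h = 1 this recovers the classical Newton iteration. A minimal illustration (the test system and starting point are made-up, not from the paper):

```python
import numpy as np

def continuation_newton(F, jac, x0, h=1.0, tol=1e-12, max_iter=100):
    """Euler integration of the continuation ODE x'(t) = -B(x)^{-1} F(x),
    where B = dF/dx is the Jacobian. Step size h = 1 gives the classical
    (quadratically convergent) Newton iteration; h < 1 damps the step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # One Euler step: x <- x - h * B(x)^{-1} F(x)
        x = x - h * np.linalg.solve(jac(x), Fx)
    return x

# Example NAE system: x0^2 + x1^2 - 4 = 0 and x0 - x1 = 0,
# whose positive root is (sqrt(2), sqrt(2)).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])
J = lambda x: np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])
root = continuation_newton(F, J, [1.0, 2.0])
```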

A Modified Regularized Newton Method for Unconstrained Nonconvex Optimization

In this paper, we present a modified regularized Newton method for unconstrained nonconvex optimization using a trust region technique. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the modified regularized Newton method (M-RNM) has a global convergence property. Numerical results show that the algorithm is very efficient.
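A common regularized Newton step of the kind this abstract refers to solves (H + λI) d = −g with λ proportional to ‖g‖, so the linear system stays well posed even where the Hessian H is indefinite. The sketch below is a generic illustration of that idea, not the paper's M-RNM; the objective and parameter μ are made-up:

```python
import numpy as np

def regularized_newton(grad, hess, x0, mu=1.0, tol=1e-8, max_iter=100):
    """Regularized Newton iteration for unconstrained minimization:
    solve (H + lam * I) d = -g with lam = mu * ||g||, so the shifted
    Hessian is invertible even when H is indefinite (nonconvex f).
    As the gradient vanishes, lam -> 0 and the step approaches Newton's."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        lam = mu * np.linalg.norm(g)
        H = hess(x) + lam * np.eye(len(x))
        x = x + np.linalg.solve(H, -g)
    return x

# Nonconvex example: f(x) = x^4/4 - x^2/2 has minimizers at x = +/-1
# and an indefinite (negative) second derivative near x = 0.
grad = lambda x: x**3 - x
hess = lambda x: np.diag(3 * x**2 - 1)
x_star = regularized_newton(grad, hess, np.array([2.0]))
```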

A Modified Orthant-Wise Limited Memory Quasi-Newton Method

where U = V_{k−m} V_{k−m+1} ⋯ V_{k−1}. For L-BFGS, we need not explicitly store the approximated inverse Hessian matrix. Instead, we only require matrix-vector multiplications at each iteration, which can be implemented by a two-loop recursion with a time complexity of O(mn) (Jorge & Stephen, 1999). Thus, we only store 2m vectors of length n: s_{k−1}, s_{k−2}, ..., s_{k−m} and y_{k−1}, y_{k−2}, ..., y_{k−m} w...
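The two-loop recursion mentioned above can be sketched as follows: it computes the product of the implicit inverse-Hessian approximation with a gradient using only the m stored pairs (s_i, y_i), in O(mn) time. A minimal sketch with made-up data:

```python
import numpy as np

def two_loop_recursion(g, s_list, y_list):
    """L-BFGS two-loop recursion: returns H_k @ g, where H_k is the
    implicit inverse-Hessian approximation built from the stored pairs
    (s_i, y_i), ordered oldest to newest. Cost is O(m n); H_k itself
    is never formed."""
    q = g.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q = q - a * y
    # Initial scaling gamma = s'y / y'y from the most recent pair.
    s, y = s_list[-1], y_list[-1]
    r = (s.dot(y) / y.dot(y)) * q
    # Second loop: oldest pair to newest (alphas were stored newest-first).
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * y.dot(r)
        r = r + (a - b) * s
    return r

# Sanity check with a single stored pair: for g = y, the secant condition
# forces the result to be exactly s.
s = np.array([1.0, 0.0])
y = np.array([2.0, 0.0])
r = two_loop_recursion(y, [s], [y])
```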

Journal

Journal title: Taiwanese Journal of Mathematics

Year: 2018

ISSN: 1027-5487

DOI: 10.11650/tjm/180303